A Sequential Metropolis-Hastings Algorithm

Author

  • Pierre Vandekerkhove
Abstract

This paper deals with the asymptotic properties of the Metropolis-Hastings algorithm when the distribution of interest is unknown but can be approximated by a sequential estimator of its density. We prove that, under very simple conditions, the rate of convergence of the Metropolis-Hastings algorithm is the same as that of the sequential estimator when the latter is introduced as the reversible measure for the Metropolis-Hastings kernel. This problem is a natural extension of a previous work on a new simulated annealing algorithm with a sequential estimator of the energy.
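The setting described above plugs a density estimate into the usual Metropolis-Hastings acceptance ratio. The sketch below shows a standard random-walk Metropolis-Hastings step targeting whatever log-density it is handed; the Gaussian `log_phi` target is an illustrative stand-in for the paper's sequential estimator, not the estimator it analyses.

```python
import math
import random

def metropolis_hastings(log_target, proposal_step, x0, n_iter, rng=random.Random(0)):
    """Random-walk Metropolis-Hastings chain targeting `log_target`.

    In the paper's setting `log_target` would be a sequentially updated
    density estimate standing in for the unknown target density.
    """
    x = x0
    samples = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, proposal_step)      # symmetric proposal
        log_alpha = log_target(y) - log_target(x)  # acceptance log-ratio
        if math.log(rng.random()) < log_alpha:
            x = y                                  # accept the move
        samples.append(x)
    return samples

# Illustrative target: standard normal log-density (up to a constant).
log_phi = lambda x: -0.5 * x * x

chain = metropolis_hastings(log_phi, proposal_step=1.0, x0=0.0, n_iter=20000)
mean = sum(chain) / len(chain)
```

Because the proposal is symmetric, the proposal densities cancel and only the target log-ratio enters the acceptance step.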


Related works

Examples comparing Importance Sampling and the Metropolis algorithm

Importance sampling, particularly sequential and adaptive importance sampling, has emerged as a simulation technique competitive with Markov chain Monte Carlo methods. We compare importance sampling and the Metropolis algorithm as two ways of changing the output of a Markov chain to get a different stationary distribution.
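The reweighting idea in the abstract above can be sketched in a few lines: draw from one distribution and reweight by the density ratio to estimate expectations under another. The proposal and target below (a wide Gaussian reweighted toward a standard normal) are illustrative choices, not examples from the paper.

```python
import math
import random

rng = random.Random(1)

# Unnormalised log-densities; the shared Gaussian constant cancels in the ratio.
def log_p(x):  # target: N(0, 1)
    return -0.5 * x * x

def log_q(x):  # proposal: N(0, 2^2)
    return -0.5 * (x / 2.0) ** 2 - math.log(2.0)

# Sample from q, weight by p/q.
xs = [rng.gauss(0.0, 2.0) for _ in range(50000)]
ws = [math.exp(log_p(x) - log_q(x)) for x in xs]

# Self-normalised importance-sampling estimate of E_p[x^2] (true value: 1).
est = sum(w * x * x for w, x in zip(ws, xs)) / sum(ws)
```

Self-normalising the weights means the densities only need to be known up to a constant, the same relaxation the Metropolis acceptance ratio enjoys.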


Sequentially Interacting Markov Chain Monte Carlo Methods

We introduce a novel methodology for sampling from a sequence of probability distributions of increasing dimension and estimating their normalizing constants. These problems are usually addressed using Sequential Monte Carlo (SMC) methods. The alternative Sequentially Interacting Markov Chain Monte Carlo (SIMCMC) scheme proposed here works by generating interacting non-Markovian sequences which...


Embarrassingly parallel sequential Markov-chain Monte Carlo for large sets of time series

Bayesian computation crucially relies on Markov chain Monte Carlo (MCMC) algorithms. In the case of massive data sets, running the Metropolis-Hastings sampler to draw from the posterior distribution becomes prohibitive due to the large number of likelihood terms that need to be calculated at each iteration. In order to perform Bayesian inference for a large set of time series, we consider an al...


A Comparison of Traditional Methods and Sequential Bayesian Methods for Blind Deconvolution Problems

This work concerns sequential techniques for the canonical blind deconvolution problem in communications signal processing, relating to the estimation of the transmitted (discrete-valued) data sequence from the observed signal at the receiver input, in the presence of unknown linear channel filtering, without recourse to extended training sequences for start-up. This problem has a significant h...


Austerity in MCMC Land: Cutting the Metropolis-Hastings Budget

A. Distribution of the test statistic. In the sequential test, we first compute the test statistic from a mini-batch of size m. If a decision cannot be made with this statistic, we keep increasing the mini-batch size by m datapoints until we reach a decision. This procedure is guaranteed to terminate as explained in Section 4. The parameter ε controls the probability of making an error in a sing...
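The grow-the-mini-batch loop described in this excerpt can be sketched as a generic sequential mean test: start with m datapoints, enlarge the batch by m until a normal-approximation statistic clears the error level ε. The function and parameter names here are illustrative stand-ins, not the paper's implementation.

```python
import math
import random
import statistics

def sequential_mean_test(data, mu0, m=100, eps=0.01, rng=random.Random(2)):
    """Toy sequential test of whether mean(data) exceeds mu0.

    Grows the mini-batch by m datapoints at a time and stops as soon as
    a t-style statistic exceeds the normal quantile implied by eps, or
    the whole data set has been consumed (so the procedure terminates).
    """
    pool = list(data)
    rng.shuffle(pool)  # mini-batches are random subsamples
    threshold = statistics.NormalDist().inv_cdf(1.0 - eps / 2.0)
    n = 0
    while True:
        n = min(n + m, len(pool))
        batch = pool[:n]
        mean = sum(batch) / n
        var = sum((x - mean) ** 2 for x in batch) / max(n - 1, 1)
        se = math.sqrt(var / n) if var > 0 else 1e-12
        t = (mean - mu0) / se
        if abs(t) > threshold or n == len(pool):
            return mean > mu0, n  # decision and datapoints consumed

rng = random.Random(3)
data = [rng.gauss(1.0, 1.0) for _ in range(10000)]
decision, used = sequential_mean_test(data, mu0=0.0)
```

When the signal is strong, as here, the test decides after the first mini-batch and touches only a small fraction of the data, which is the budget saving the paper exploits.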



Journal:

Volume   Issue 

Pages  -

Publication date: 1998